Web Survey Bibliography
Over the last decades, cross-national data production in social science research has increased tremendously (Harkness 2008). The large-scale provision and widespread use of cross-national data sets constitute a huge opportunity for the research community but also pose the challenge of developing cross-nationally comparable survey items (Lynn, Japec, and Lyberg 2006). At the same time, substantive researchers are increasingly aware of the necessity of understanding respondents’ cognitive processes when answering a survey question (Smith et al. 2011). Recently, the method of online probing has been developed, which implements probing techniques from cognitive interviewing in web surveys. In the traditional probing approach, interviewers obtain additional information by asking follow-up questions called probes (Beatty and Willis 2007). Online probing, in contrast, administers probes as open-ended questions in the web context. It can reveal the cognitive processes of web survey participants and helps to assess whether respondents’ interpretations of an item differ across countries (Braun et al. 2015).
The implementation of probes within web surveys offers respondents a higher level of anonymity than the laboratory situation of cognitive interviewing (Behr and Braun 2015), which potentially reduces social desirability effects in the response process (Bethlehem and Biffignandi 2012). Online probing can easily realize large sample sizes, which increases the generalizability of the results, enables an evaluation of the prevalence of problems or themes, and can explain the response patterns of specific subpopulations (Braun et al. 2015). Since all probes have to be programmed in advance, all respondents receive the same probe, and the procedure is highly standardized (Braun et al. 2015). When applied to cross-national data, online probing is a powerful tool for assessing the comparability of questions. In contrast to traditional quantitative approaches to assessing the equivalence of items (e.g., measurement invariance tests), online probing can explain why respondents in certain countries might misunderstand a specific item or why they adopt different perspectives when providing a response (Behr et al. 2014a).
The overarching goal of this dissertation project is to explore the potential of online probing vis-à-vis other relevant methods that share similar goals (cognitive interviewing and measurement invariance tests) and as an assessment tool for single-item indicators in cross-national surveys. In particular, the dissertation addresses the following research questions: 1) Does online probing arrive at results similar to those of other methods? 2) What are the strengths and weaknesses of online probing in comparison to other methods? 3) How can online probing be combined with other methods in a mixed-methods approach? 4) How useful is online probing for assessing the cross-national comparability of single-item indicators? Since the dissertation’s goal is to compare the methods of online probing, cognitive interviewing, and measurement invariance tests with regard to their potential to detect problematic issues at the item level, the field of national identity was chosen as a substantive application for the method comparisons, owing to the existence of potentially problematic measures in a cross-national context. This dissertation focused on items from the 2013 International Social Survey Programme (ISSP) module on National Identity.
The first article of this dissertation (“Comparing Cognitive Interviewing and Online Probing: Do They Find Similar Results?”; published in Field Methods) analyzed whether online probing and cognitive interviewing arrive at similar conclusions with regard to error detection and to the themes mentioned by respondents when applied to the same set of items (the ISSP item battery on specific national pride). The study compares data from cognitive interviews conducted with 20 German respondents in April 2013 with a web survey of 532 German respondents conducted in September 2013. The article revealed that the two methods have complementary strengths and weaknesses: while probing answers in cognitive interviewing show indications of higher response quality, online probing can compensate through a larger sample size. The article also offers guidance on which method is preferable in a given research situation and advocates combining both methods in a mixed-methods approach.
The second article of this dissertation (“Necessary but Insufficient: Why Measurement Invariance Tests Need Online Probing as a Complementary Tool”; forthcoming in Public Opinion Quarterly; recipient of the 2016 AAPOR/WAPOR Janet A. Harkness Award and the 2016 QDET2 Monroe Sirken Innovative Paper Award for Young Scholars of Question Evaluation) provides an example of a mixed-methods approach that combines online probing with quantitative measurement invariance tests. Using the concepts of constructive patriotism and nationalism as examples, this study explains how the combination of both methods can not only identify incomparable items and countries but also explain issues related to cross-national comparability. By analyzing data from the 2013 ISSP and a web survey with 2,685 respondents from five countries, online probing uncovered the reasons for the lack of comparability (varying lexical scope and silent misunderstanding of a key term) that the measurement invariance tests had also detected.
Finally, the third article showed the potential of online probing for assessing the cross-national comparability of single-item indicators, using the example of the general national pride item. Online probing offers a unique solution for deciding whether single-item indicators are equivalent, because the traditional approach of measurement invariance testing presupposes multiple-indicator measures and is therefore inapplicable to single-item indicators. This study analyzed 2,685 probe responses from a web survey conducted in five countries. Online probing uncovered several potentially problematic issues, as well as the fact that respondents in all countries associate various concepts with the general national pride item.
In sum, the contributions of this dissertation are:
1. The insight that online probing arrives at results similar to those of cognitive interviewing and measurement invariance tests.
2. A clear understanding of the method’s strengths and weaknesses vis-à-vis cognitive interviewing and measurement invariance tests.
3. An explanation of optimal implementations of online probing in a mixed-methods approach.
4. A demonstration of the usefulness of online probing to assess the cross-national comparability of single-item indicators.
5. An assessment of the cross-national comparability of measures of national identity for substantive researchers.
Web survey bibliography - Germany (361)
- Interviewer effects on onliner and offliner participation in the German Internet Panel; 2017; Herzing, J. M. E.; Blom, A. G.; Meuleman, B.
- Comparing the same Questionnaire between five Online Panels: A Study of the Effect of Recruitment Strategy...; 2017; Schnell, R.; Panreck, L.
- Push2web or less is more? Experimental evidence from a mixed-mode population survey at the community...; 2017; Neumann, R.; Haeder, M.; Brust, O.; Dittrich, E.; von Hermanni, H.
- Social Desirability and Undesirability Effects on Survey Response latencies; 2017; Andersen, H.; Mayerl, J.
- Comparison of response patterns in different survey designs: a longitudinal panel with mixed-mode and...; 2017; Ruebsamen, N.; Akmatov, M. K.; Castell, S.; Karch, A.; Mikolajczyk, R. T.
- Mobile Research im Kontext der digitalen Transformation; 2017; Friedrich-Freksa, M.
- Kognitives Pretesting; 2017; Neuert, C.
- Grundzüge des Datenschutzrechts und aktuelle Datenschutzprobleme in der Markt- und Sozialforschung; 2017; Schweizer, A.
- Article Establishing an Open Probability-Based Mixed-Mode Panel of the General Population in Germany...; 2017; Bosnjak, M.; Dannwolf, T.; Enderle, T.; Schaurer, I.; Struminskaya, B.; Tanner, A.; Weyandt, K.
- Socially Desirable Responding in Web-Based Questionnaires: A Meta-Analytic Review of the Candor Hypothesis...; 2016; Gnambs, T.; Kaspar, K.
- Methodological Aspects of Central Left-Right Scale Placement in a Cross-national Perspective; 2016; Scholz, E.; Zuell, C.
- Predicting and Preventing Break-Offs in Web Surveys; 2016; Mittereder, F.
- Incorporating eye tracking into cognitive interviewing to pretest survey questions; 2016; Neuert, C.; Lenzner, T.
- Geht’s auch mit der Maus? – Eine Methodenstudie zu Online-Befragungen in der Jugendforschung...; 2016; Heim, R.; Konowalczyk, S.; Grgic, M.; Seyda, M.; Burrmann, U.; Rauschenbach, T.
- Comparing Cognitive Interviewing and Online Probing: Do They Find Similar Results?; 2016; Meitinger, K.; Behr, D.
- Device Effects - How different screen sizes affect answers in online surveys; 2016; Fisher, B.; Bernet, F.
- Effects of motivating question types with graphical support in multi channel design studies; 2016; Luetters, H.; Friedrich-Freksa, M.; Vitt, S.; Goldstein, D. G.
- Analyzing Cognitive Burden of Survey Questions with Paradata: A Web Survey Experiment; 2016; Hoehne, J. K.; Schlosser, S.; Krebs, D.
- Secondary Respondent Consent in the German Family Panel; 2016; Schmiedeberg, C.; Castiglioni, L.; Schroeder, J.
- Does Changing Monetary Incentive Schemes in Panel Studies Affect Cooperation? A Quasi-experiment on...; 2016; Schaurer, I.; Bosnjak, M.
- Using Cash Incentives to Help Recruitment in a Probability Based Web Panel: The Effects on Sign Up Rates...; 2016; Krieger, U.
- The Mobile Web Only Population: Socio-demographic Characteristics and Potential Bias ; 2016; Fuchs, M.; Metzler, A.
- The Impact of Scale Direction, Alignment and Length on Responses to Rating Scale Questions in a Web...; 2016; Keusch, F.; Liu, M.; Yan, T.
- Web Surveys Versus Other Survey Modes: An Updated Meta-analysis Comparing Response Rates ; 2016; Wengrzik, J.; Bosnjak, M.; Lozar Manfreda, K.
- Retrospective Measurement of Students’ Extracurricular Activities with a Self-administered Calendar...; 2016; Furthmueller, P.
- Privacy Concerns in Responses to Sensitive Questions. A Survey Experiment on the Influence of Numeric...; 2016; Bader, F.; Bauer, J.; Kroher, M.; Riordan, P.
- Ballpoint Pens as Incentives with Mail Questionnaires – Results of a Survey Experiment; 2016; Heise, M.
- Does survey mode matter for studying electoral behaviour? Evidence from the 2009 German Longitudinal...; 2016; Bytzek, E.; Bieber, I. E.
- Forecasting proportional representation elections from non-representative expectation surveys; 2016; Graefe, A.
- Setting Up an Online Panel Representative of the General Population The German Internet Panel; 2016; Blom, A. G.; Gathmann, C.; Krieger, U.
- Online Surveys are Mixed-Device Surveys. Issues Associated with the Use of Different (Mobile) Devices...; 2016; Toepoel, V.; Lugtig, P. J.
- Stable Relationships, Stable Participation? The Effects of Partnership Dissolution and Changes in Relationship...; 2016; Mueller, B.; Castiglioni, L.
- Will They Stay or Will They Go? Personality Predictors of Dropout in Online Study; 2016; Nestler, S.; Thielsch, M.; Vasilev, E.; Back, M.
- Respondent Conditioning in Online Panel Surveys: Results of Two Field Experiments; 2016; Struminskaya, B.
- A Privacy-Friendly Method to Reward Participants of Online-Surveys; 2015; Herfert, M.; Lange, B.; Selzer, A.; Waldmann, U.
- The impact of frequency rating scale formats on the measurement of latent variables in web surveys -...; 2015; Menold, N.; Kemper, C. J.
- Investigating response order effects in web surveys using eye tracking; 2015; Hoehne, J. K.; Lenzner, T.
- Implementation of the forced answering option within online surveys: Do higher item response rates come...; 2015; Decieux, J. P.; Mergener, A.; Neufang, K.; Sischka, P.
- Translating Answers to Open-ended Survey Questions in Cross-cultural Research: A Case Study on the Interplay...; 2015; Behr, D.
- The Effects of Questionnaire Completion Using Mobile Devices on Data Quality. Evidence from a Probability...; 2015; Bosnjak, M.; Struminskaya, B.; Weyandt, K.
- Are they willing to use the web? First results of a possible switch from PAPI to CAPI/CAWI in an establishment...; 2015; Ellguth, P.; Kohaut, S.
- Measuring Political Knowledge in Web-Based Surveys: An Experimental Validation of Visual Versus Verbal...; 2015; Munzert, S.; Selb, P.
- Changing from CAPI to CAWI in an ongoing household panel - experiences from the German Socio-Economic...; 2015; Schupp, J.; Sassenroth, D.
- Rating Scales in Web Surveys: A Test of New Drag-and-Drop Rating Procedures; 2015; Kunz, T.
- Mode System Effects in an Online Panel Study: Comparing a Probability-based Online Panel with two Face...; 2015; Struminskaya, B.; De Leeuw, E. D.; Kaczmirek, L.
- Higher response rates at the expense of validity? Consequences of the implementation of the ‘forced...; 2015; Decieux, J. P.; Mergener, A.; Neufang, K.; Sischka, P.
- A quasi-experiment on effects of prepaid versus promised incentives on participation in a probability...; 2015; Schaurer, I.; Bosnjak, M.
- Response Effects of Prenotification, Prepaid Cash, Prepaid Vouchers, and Postpaid Vouchers: An Experimental...; 2015; van Veen, F.; Goeritz, A.; Sattler, S.
- Recruiting Respondents for a Mobile Phone Panel: The Impact of Recruitment Question Wording on Cooperation...; 2015; Busse, B.; Fuchs, M.
- The Influence of the Answer Box Size on Item Nonresponse to Open-Ended Questions in a Web Survey ; 2015; Zuell, C.; Menold, N.; Koerber, S.